A limited memory BFGS-type method for large-scale unconstrained optimization
Authors
Abstract
Similar resources
A limited memory adaptive trust-region approach for large-scale unconstrained optimization
This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique lets us decrease the number of subproblems to be solved, while utilizing the structure of limited memory quasi-Newt...
On the limited memory BFGS method for large scale optimization
We study the numerical performance of a limited-memory quasi-Newton method for large-scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir, which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and is better a...
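The L-BFGS method described above keeps only a small number of recent correction pairs (s, y) and applies the inverse-Hessian approximation implicitly. A minimal sketch of the standard two-loop recursion, assuming NumPy; the function name and arguments are illustrative, and this is not the exact code of any paper listed here:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the implicit L-BFGS inverse-Hessian
    approximation (built from the stored (s, y) pairs, oldest first)
    to the current gradient, returning the search direction -H*grad."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Scale by gamma = s'y / y'y from the most recent pair
    # (a common choice for the initial Hessian approximation).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk the pairs from oldest to newest.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```

With no stored pairs this reduces to steepest descent scaled by gamma; each stored pair costs only O(n) work and storage, which is what makes the method suitable for large-scale problems.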
A regularized limited-memory BFGS method for unconstrained minimization problems
The limited-memory BFGS (L-BFGS) algorithm is a popular method for solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization
Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expandi...
Journal
Journal title: Computers & Mathematics with Applications
Year: 2008
ISSN: 0898-1221
DOI: 10.1016/j.camwa.2008.01.028